Demonstrably Doing Accountability in the Internet of Things
Abstract
This paper explores the importance of accountability to data protection, and how it can be built into the Internet of Things (IoT). The need to build accountability into the IoT is motivated by the opaque nature of distributed data flows, inadequate consent mechanisms, and a lack of interfaces enabling end-user control over the behaviours of internet-enabled devices. The lack of accountability precludes meaningful engagement by end-users with their personal data and poses a key challenge to creating user trust in the IoT and the reciprocal development of the digital economy. The EU General Data Protection Regulation 2016 (GDPR) seeks to remedy this particular problem by mandating that a rapidly developing technological ecosystem be made accountable. In doing so it foregrounds new responsibilities for data controllers, including data protection by design and default, and new data subject rights such as the right to data portability. While GDPR is 'technologically neutral', it is nevertheless anticipated that realising its vision will turn upon effective technological development. Accordingly, this paper examines the notion of accountability, how it has been translated into systems design recommendations for the IoT, and how the IoT Databox puts key data protection principles into practice.

INTRODUCTION

The 'connected home' currently sits at the 'peak of inflated expectations' in Gartner's often-cited hype cycle, and the Internet of Things (IoT) is a key driver of the hype. A cursory glance at the consumer IoT market reveals swathes of household goods with the prefix 'smart' or 'intelligent' on offer, spanning white goods to fixtures and fittings embedded in the fabric of the home. The promise of the IoT is greater convenience, security, safety, efficiency and comfort in a user's everyday life. While the necessity of many IoT products and services may be questionable, anticipated growth in the sector is vast: major IT firms like Cisco, Ericsson, General Electric and Accenture all predict billions of networked devices in the coming years. The IoT essentially trades on data, both actively and passively, with inputs ranging from explicit spoken voice commands to sensed data inputs implicated in such things as movement or temperature monitoring. The IoT also aligns with other trends in computing, particularly big data, cloud computing and machine learning, with personal data collected by IoT devices typically being distributed to the cloud for processing and analytics.

1. Kasey Panetta, "Top Trends in the Gartner Hype Cycle for Emerging Technologies, 2017," Smarter With Gartner, Gartner, 2017, http://www.gartner.com/smarterwithgartner/top-trends-in-the-gartner-hype-cycle-foremerging-technologies-2017/.
2. http://iotlist.co
3. Matthew Reynolds, "These Bizarre Connected Devices Really Shouldn't Exist," Wired UK, 2017, http://www.wired.co.uk/article/strangest-internet-of-things-devices; The Internet of Useless Things website: http://www.internetofuselessthings.io
4. Cisco, "The Internet of Everything" (San Jose, 2013), http://www.cisco.com/c/dam/en_us/about/businessinsights/docs/ioe-value-at-stake-public-sector-analysis-faq.pdf; Louis Columbus, "Roundup of the Internet of Things Forecasts and Market Estimates, 2016," Forbes Tech, 2016, http://www.forbes.com/sites/louiscolumbus/2016/03/13/roundup-of-cloud-computing-forecasts-and-marketestimates-2016/#3a6bfd6e74b0.

Accompanying the diversity of IoT devices and services are concerns centring on privacy and trust.
When sensing occurs in the home, for example, patterns of behaviour can be detected and inferences made about inhabitants' lifestyles. Depending on who is making these inferences, and who they share the data with, privacy harms can emerge. As Nissenbaum argues, inappropriate flows of information between contexts can cause harm to an individual's sense of privacy.[5] The nascent nature of the industry means there is a lack of harmonised standards for building IoT devices in ways that sufficiently foreground and anticipate data protection concerns. Building trustworthy relationships with consumers in the new IoT infrastructure is critical, not least because an increasing array of high-profile stories about IoT devices leaking data, or being hacked and becoming implicated in widespread distributed denial of service attacks, contribute to a diminishing sense of trust in the emerging infrastructure.

Against this background we elaborate key challenges posed by the IoT from a regulatory perspective and how these practically occasion the need for accountability. These include challenges posed by devices that lack or only provide partial user interfaces and compliant consent mechanisms; the opacity of data flows to end-users and the spectrum of GDPR control rights; machine-to-machine communications and the legitimacy of access; and cloud storage and international data transfer safeguards. We move on to explore various aspects of the Accountability Principle, first its history in data protection governance and then how it is presented in Article 5(2) of GDPR. This exploration involves questioning the nature of the account to be provided, how it is to be provided, and to whom. We situate Article 5(2) within the wider context of GDPR, turning to various requirements of Article 24 as interpreted in GDPR recitals, and other related Articles, to map how they intersect with accountability. The requirements of GDPR pose distinct challenges to the development of technological systems, and we subsequently turn to consider the recommendations of the Article 29 Working Party, and how they envisage GDPR playing out in the IoT, as a preface to presenting the IoT Databox. We conclude by mapping how the IoT Databox addresses the different accountability requirements of GDPR.

THE PRACTICAL NEED TO BUILD ACCOUNTABILITY INTO THE IoT

From May 2018, GDPR will be enforced across all European Union member states. It will also affect data controllers outside Europe if they target goods and services to, or otherwise monitor, EU citizens.

5. Helen Fay Nissenbaum, Privacy in Context: Technology, Policy, and the Integrity of Social Life (Stanford Law Books, 2010).
6. Ian Brown, "GSR Discussion Paper: Regulation and the Internet of Things," International Telecommunications Union (Geneva, 2015); K. Rose, "Internet of Things: An Overview," Geneva: Internet Society, 2015.
7. Onora O'Neill, TED Talk: https://www.ted.com/talks/onora_o_neill_what_we_don_t_understand_about_trust; UK Digital Catapult Personal Data and Trust Network, https://pdtn.org (brings together businesses and academics in a network to consider issues surrounding trust in IT); Gilad Rosner, Privacy and the Internet of Things (O'Reilly Media, 2016), http://www.oreilly.com/iot/free/privacy-and-the-iot.csp.
8. Richard Chirgwin, "'CloudPets' Woes Worsen: Webpages Can Turn Kids' Stuffed Toys into Creepy Audio Bugs," The Register, 2017, accessed 16 October 2017; Samuel Gibbs, "Hackers Can Hijack Wi-Fi Hello Barbie to Spy on Your Children," The Guardian, 2015, https://www.theguardian.com/technology/2015/nov/26/hackers-can-hijack-wi-fi-hello-barbie-to-spy-onyour-children; Cory Doctorow, "Kids' Smart Watches Are a Security/Privacy Dumpster-Fire," BoingBoing, 2017, https://boingboing.net/2017/10/21/remote-listening-device.html.
9. Brian Krebs, "Hacked Cameras, DVRs Powered Today's Massive Internet Outage," Krebs on Security, 2016, https://krebsonsecurity.com/2016/10/hacked-cameras-dvrs-powered-todays-massive-internet-outage/.
10. European Commission, "Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the Protection of Natural Persons with Regard to the Processing of Personal Data and on the Free Movement of Such Data, and Repealing Directive 95/46/EC" (Brussels: Official Journal of the European Union L119, Vol. 59, 2016).

Seeking to bring data protection laws into the 21st century, GDPR replaces the pre-Internet Data Protection Directive 1995. The IoT sector is heavily driven by personal data, meaning it is critical that IoT developers negotiate their relationship with the new user rights and controller responsibilities mandated by GDPR. This includes a raft of fresh legal rules governing the processing of personal data, along with an extension of the rights provided to data subjects and the responsibilities incumbent on data controllers, all of which are impacted by the underlying technological infrastructure.

Lack of or partial user interfaces and consent

The design of IoT devices is heterogeneous. Unlike mobile phones, where users can develop mental models about how their devices work, 'interfaces' to the IoT vary immensely. Many IoT devices do not have screens, and communication with users relies instead on lights, sounds or haptic feedback; text notifications to mobile phones may also be leveraged in the absence of direct device feedback, occasioned by the desire to create aesthetically pleasing devices, which may in turn result in opacity about device functionality. This diversity makes it hard for users to understand what personal information is being collected and how it is being used.

From a regulatory perspective, this shapes the nature of consent mechanisms. Consent is one legal basis for processing personal data. Consent follows a notice and choice model, meaning it should be informed, unambiguous, freely given and specific to a particular process, and enable a clear indication of the data subject's will. Data subjects need to affirm their choice, and if the type of data being processed falls within the special categories of personal data (e.g., health, racial or ethnic origin, or biometric information) explicit consent is needed. Such consent cannot be obtained through pre-ticked boxes, silence or inactivity on the part of the subject. The dominant web-based model takes advantage of the affordances of mobile devices, using screens to display privacy policies and terms and conditions contracts containing large blocks of text. Extensive research shows users do not read this text, as it would take an impractically long time to do so, and hence they often agree regardless. Even if they did read it, they cannot renegotiate it, as it is a form contract, and they may not understand it due to complex literacy requirements.
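Part of the remedy is making consent events demonstrable in their own right. As a concrete illustration, the following minimal Python sketch shows one way a controller might record a consent event so that the Article 7 elements (freely given, specific, informed, unambiguously affirmed, and withdrawable) can later be evidenced; the class, field names and values are our own assumptions, not a standard schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional
from uuid import uuid4

@dataclass
class ConsentRecord:
    """Hypothetical machine-readable record of one consent event,
    capturing who consented, to what specific purpose, when, and how
    the choice was affirmed (GDPR Article 7 in spirit)."""
    data_subject_id: str
    purpose: str                 # one specific purpose per record, never bundled
    data_categories: List[str]   # e.g. ["room_temperature", "motion_events"]
    affirmative_action: str      # e.g. "button_press"; never "pre_ticked_box"
    timestamp: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())
    record_id: str = field(default_factory=lambda: str(uuid4()))
    withdrawn_at: Optional[str] = None  # withdrawal must be as easy as giving consent

    def withdraw(self) -> None:
        self.withdrawn_at = datetime.now(timezone.utc).isoformat()

# One record per purpose keeps consent 'specific' rather than bundled.
consent = ConsentRecord(
    data_subject_id="subject-42",
    purpose="heating optimisation",
    data_categories=["room_temperature"],
    affirmative_action="button_press",
)
```

Keeping one record per purpose, with an explicit affirmative action, is one plausible way a controller could later demonstrate that consent was specific and unambiguous rather than inferred from inactivity.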
The prevailing notice-and-choice situation is far from ideal and challenges the notion of legally compliant consent. The heterogeneity of IoT devices could be good or bad for consent processes. On the one hand, consent could be frustrated by devices which, by design, ambiently collect data and have interfaces that lack affordances for communicating clear information. This could be particularly challenging for homes, where children and adults cohabit, as GDPR introduces stricter requirements about delivering clear, concise, comprehensible information to children about data processing. On the other hand, the IoT poses an opportunity to redesign how consent is done with users. Taking advantage of new interaction methods may provide for the ongoing negotiation of the terms of consent.

11. Article 3(2) GDPR.
12. Martina Ziefle and Susanne Bay, "Mental Models of a Cellular Phone Menu: Comparing Older and Younger Novice Users," in Brewster, S., Dunlop, M. (eds.), Mobile Human Computer Interaction, LNCS 3160 (Springer, Berlin, 2004), 25–37, doi:10.1007/978-3-540-28637-0_3.
13. Article 2(h) DPD; Article 4(11) GDPR.
14. Recital 32 GDPR 2016.
15. Aleecia M. McDonald and Lorrie Faith Cranor, "The Cost of Reading Privacy Policies," I/S: A Journal of Law and Policy for the Information Society, 2008, http://www.is-journal.org/.
16. Ewa Luger, Stuart Moran, and Tom Rodden, "Consent for All," in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems CHI '13 (New York: ACM Press, 2013), 2687, doi:10.1145/2470654.2481371.
17. Article 12 GDPR.
18. Ewa Luger and Tom Rodden, "The Value of Consent: Discussions with Designers of Ubiquitous Computing Systems," in 2014 IEEE International Conference on Pervasive Computing and Communication Workshops (PERCOM WORKSHOPS 2014), 2014, 388–93, doi:10.1109/PerComW.2014.6815237; Lachlan Urquhart and Tom Rodden, "New Directions in Information Technology Law: Learning from Human–Computer Interaction," International Review of Law, Computers & Technology 31, no. 2 (May 4, 2017): 150–69, doi:10.1080/13600869.2017.1298501.

Opacity of data flows to end users and control

IoT devices and the digital ecosystems they feed into are largely opaque in how they handle data. Insofar as end-users may struggle to understand how their devices work, given the lack of effective interfaces, this may in turn lead to a lack of legibility in how data is being processed, why, by whom, where it is being stored, for how long, and so on. This has the knock-on effect of making it hard for users to exercise their legal rights and to control use of their information. While no hierarchical framing of rights is encoded in GDPR, a spectrum of control rights enabling data subjects to escalate action from controllers is nevertheless discernible, and underpins accountability in GDPR:

• Article 15, the 'right to access', or the right to discover what data is held by the controller about the data subject.
• Article 16, the 'right to rectification', or the right to correct erroneous data held by the controller.
• Article 21, the 'right to object', or the right to object to the processing of data by the controller.
• Article 18, the 'right to restriction of processing', or the right to require the controller to restrict processing of data.
• Article 20, the 'right to data portability', or the right to have a controller provide data to the data subject in a commonly used, machine readable format to take to another controller.
• Article 17, the 'right to erasure', or the right to have data deleted by the controller and for the data subject to thereby be forgotten.

Each of these control rights occasions practical challenges of implementation. If we take data portability, for example, how can data from sensors be moved between IoT service providers in a usable way?
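To make the portability question concrete, the following minimal Python sketch shows how sensor readings might be exported in 'commonly used, machine readable' formats in the spirit of Article 20; the record layout and format label are our own assumptions rather than any agreed interchange standard.

```python
import csv
import json
from io import StringIO

# Hypothetical sensor readings as an IoT service might hold them internally.
readings = [
    {"sensor": "thermostat", "unit": "celsius", "time": "2018-01-05T09:00:00Z", "value": 19.5},
    {"sensor": "thermostat", "unit": "celsius", "time": "2018-01-05T09:15:00Z", "value": 20.1},
]

def export_json(rows: list) -> str:
    """Structured export: keeps units and timestamps explicit so a
    receiving controller can re-import without guessing semantics."""
    return json.dumps({"format": "iot-export/v1", "readings": rows}, indent=2)

def export_csv(rows: list) -> str:
    """Flat export for spreadsheet tools; loses nesting but remains
    'commonly used and machine readable' in Article 20's terms."""
    buf = StringIO()
    writer = csv.DictWriter(buf, fieldnames=["sensor", "unit", "time", "value"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

print(export_json(readings))
print(export_csv(readings))
```

The hard part, as footnote 20 elaborates, is not serialisation but agreeing semantics between providers so that an export from one service remains usable by another.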
Equally challenging, and key to control, is the need to surface and make visible what information is being processed in the first place.

Machine to machine communications and access

The connected home consists of a network of connected devices, many of which may interact with one another. We already see this commercially, with home management systems like 'Works with Nest' or Apple's 'HomeKit' linking together manufacturers' devices and third-party offerings. However, and again due to the paucity of interfaces to the IoT, the lack of human oversight in machine to machine (M2M) communications makes it hard for users to know what is being shared between devices, and whether this is contextually appropriate or not. A good example is sensitive personal data collected by a smart mirror detecting someone's skin condition, or smart bathroom scales sensing rapid weight loss over time, indicating possible health conditions.

19. Peter Tolmie et al., "'This Has to Be the Cats': Personal Data Legibility in Networked Sensing Systems," in CSCW '16 (San Francisco, 2016), 491–502, doi:10.1145/2818048.2819992.
20. For more detail see Lachlan Urquhart, Neelima Sailaja, and Derek McAuley, "Realising the Right to Data Portability for the Domestic Internet of Things," Personal and Ubiquitous Computing, August 23, 2017, doi:10.1007/s00779-017-1069-2.
21. Withings Smart Body Analyser; SEMEOTICONS Wize Mirror – FP7 Project.

Ideally, to respect the agency of users and build their trust, such data should not be shared with a health insurance mandated wearable health tracker, unless the user wants it to be. Similarly, access by an Amazon Dash-inspired replenishment button, perhaps sponsored by a pharmaceutical firm pushing a new skincare range, should have human oversight too. The challenge here is balancing the movement of personal data, the utility of devices and business models with ensuring legitimate access to data by different devices and services. By limiting the role of users in the loop, it becomes harder to know whether appropriate access is being given (or not) by devices. Linking datasets without adequate access management could also have impacts for data controllers, who need to ensure compliance with DP rules, and users, who may suffer information privacy harms through unexpected data sharing.
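To illustrate what human oversight of M2M flows might look like in code, the sketch below implements a default-deny sharing policy that queues sensitive flows for householder approval; the device names, categories and approval queue are hypothetical, not drawn from any real home hub API.

```python
# A minimal sketch of human-in-the-loop access control between devices.
SENSITIVE_CATEGORIES = {"health", "biometrics"}

# Flows the householder has already approved: (source, recipient, category).
approved_flows = {("bathroom_scales", "gp_dashboard", "health")}
pending_approval = []

def may_share(source: str, recipient: str, category: str) -> bool:
    """Allow a machine-to-machine flow only if a human has approved it;
    sensitive categories are queued for review rather than shared silently."""
    if (source, recipient, category) in approved_flows:
        return True
    if category in SENSITIVE_CATEGORIES:
        pending_approval.append((source, recipient, category))  # surface to the user
    return False  # default-deny keeps unexpected flows visible rather than silent

# A sponsored replenishment button asking for skin-condition data is queued
# for the user's decision instead of being granted automatically.
print(may_share("smart_mirror", "replenishment_button", "health"))  # False
print(pending_approval)
```

The design choice here is simply that absence of an explicit grant means no flow, so the question of "appropriate access" is always surfaced to the person whose data is at stake.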
Cloud storage and international data transfer

The nature of remote, cloud-based data storage utilised by most IoT devices is also problematic under GDPR. Services using IoT sensor data often store collected data on servers located outside of the EU. This enables businesses to create large datasets, used in training machine learning algorithms and finding patterns that can be exploited either in service delivery or in the creation of new services. Managing big data sets raises challenges addressing the velocity, variety, veracity and volume of data. From the perspective of ensuring GDPR compliance, users will struggle to know where their data is, or how they can access and control it, when its storage location is likely unknown or geographically distant. Again, oversight over what it is being used for becomes difficult, and from a legal perspective, issues of jurisdiction and applicable law in contract clauses can come to the fore.

From a data protection stance, adequate protection of data when it leaves the EU is difficult, and measures to guarantee protection, like Privacy Shield (which replaced the former Safe Harbor agreement) or model contract clauses, all have their flaws. Furthermore, as mentioned above, Article 3(2) expands the reach of GDPR to controllers outside of the EU monitoring or targeting goods and services towards EU citizens. Cloud providers cannot, therefore, ignore the importance of GDPR compliance. The alternative of local data storage, keeping information proximate to end users, is preferable for ensuring their control over how it is processed, and for ensuring more user-centric, ethical IoT applications can emerge in the future.

The IoT ecosystem, by design, is opaque, and its actions are often invisible to end users. In contravention of DP law principles, interactions are being designed that provide little information about how devices function, what data is collected, and what trade-offs consumers are making in order to receive relevant services. This is not sustainable, and risks the growth of the sector. It is for these reasons that we argue that accountability needs to be built into the IoT. But what exactly do we mean?

ACCOUNTABILITY?

We are of the view that the answer to many of the regulatory issues surfaced by the IoT is to build accountability into products and services, by design. Increased dialogue between data controllers and data subjects is needed so that citizens can exercise better control over how their personal data is exploited in the digital economy. Due to GDPR, interest in accountability as a governance mechanism is growing. However, it remains a difficult concept to succinctly pin down. The accountability principle is only substantively mentioned once in GDPR, in Article 5(2).[24] However, its implications quickly spiral when read in the wider context of GDPR, in conjunction with Article 24, various recitals, and other relevant Articles.

22. ICO, "Big Data, Artificial Intelligence, Machine Learning and Data Protection" (Wilmslow, 2017), https://ico.org.uk/media/for-organisations/documents/2013559/big-data-ai-ml-and-data-protection.pdf.
23. https://www.privacyshield.gov/welcome; EDPS, Opinion on EU-US Privacy Shield (Brussels, 2016); Article 29 Working Party, Opinion 01/2016 on the EU-US Privacy Shield Draft Adequacy Decision (Brussels, 2016).

Historically, there has been a strong relationship between accountability and data protection compliance. In this context, accountability has traditionally been invoked as a mechanism for implementing data protection principles. As Alhadeff et al. point out, "... even in instruments where accountability is not called out as a separate data protection principle, many of its substantive provisions were in fact designed to enable accountability". The Article 29 Working Party has argued that accountability obliges data controllers to put in place effective policies and mechanisms to ensure compliance with data protection rules, a view endorsed by Alhadeff et al., who underscore the importance of making data processing entities answerable, of 'calling them to account', for the implementation of appropriate safeguards. The European Data Protection Supervisor (EDPS) argues accountability is not a prescriptive bureaucratic measure merely concerned with validation, but is about proactive leadership to foster a broad culture of accountability. The introduction of GDPR puts measures in place that further develop this culture of accountability.
Adopting a similar framing of the accountability principle created 37 years ago in the OECD Guidelines on the Protection of Privacy and Trans-Border Flows of Personal Data 1980 (paragraph 14), GDPR states: "The controller shall be responsible for, and be able to demonstrate compliance with, paragraph 1 ('accountability')." (Article 5(2)). This means the controller is responsible for processing personal data in compliance with the principles found in GDPR, which are themselves similar to the OECD's good DP governance principles (paragraphs 7–13). Article 5(1) GDPR includes: (a) lawfulness, fairness and transparency; (b) purpose limitation; (c) data minimisation; (d) accuracy; (e) storage limitation; (f) integrity and confidentiality. Where OECD and GDPR differ is in the explicit requirement to demonstrate compliance with the different principles. Accordingly, there is a two-part responsibility on data controllers: firstly, to put the necessary measures in place to comply with Article 5(1), and secondly, to find ways to demonstrate they have complied. This could be viewed as firstly a 'substantive compliance with principles' requirement, and secondly a 'procedural demonstration of compliance to relevant stakeholders' requirement. We shall revisit these distinctive aspects of accountability in due course. First we wish to consider what nature an account needs to take, and to whom accountability should be demonstrated.[30]

24. Excluding references in recitals 61 and 85 in the context of data breaches.
25. Charles Raab, "The Meaning of 'Accountability' in the Information Privacy Context," in Managing Privacy through Accountability (London: Palgrave Macmillan UK, 2012), 17, doi:10.1057/9781137032225_2.
26. Joseph Alhadeff, Brendan Van Alsenoy, and Jos Dumortier, "The Accountability Principle in Data Protection Regulation: Origin, Development and Future Directions," in Managing Privacy through Accountability (London: Palgrave Macmillan UK, 2012), 6, doi:10.1057/9781137032225_4.
27. EU Article 29 Working Party, "Opinion 3/2010 on the Principle of Accountability," A29 Working Party, 2010, 3, http://ec.europa.eu/justice/data-protection/article29/documentation/opinion-recommendation/files/2012/wp193_en.pdf.
28. Alhadeff, Van Alsenoy, and Dumortier, "The Accountability Principle in Data Protection Regulation," 19.
29. Ibid., 4.
30. Bennett, C. (2010) "International privacy standards: Can accountability ever be adequate?", Privacy Laws & Business International Newsletter, Issue 106, pp. 21–23, at 22.

The nature of an account and to whom accountability must be demonstrated

The current approach in GDPR of not explicitly defining what accountability requires of data controllers is intentional. This again follows the OECD 1980 guidelines, which, as Alhadeff et al. state, "... do not prescribe to whom the controller should be accountable (the 'accountee'), nor what this relationship should look like." In their 2010 Opinion on Accountability, the Article 29 Working Party (A29 WP) suggested that putting an explicit accountability principle into GDPR would enable case-by-case analysis of appropriate measures, and be preferable to predefining requirements, this approach being more flexible and scalable. Seven years on, the most recent A29 WP Guidance on Data Protection Impact Assessments retains a non-prescriptive stance about the measures needed for accountability, beyond publishing DPIAs and the obligation of record keeping.
The lack of detailed prescriptive guidance around such a central concept is consistent with original OECD practice, and keeps accountability sufficiently flexible as a notion. Despite the virtues of flexibility, a sticking point for accountability in practice is the form a demonstrable account needs to take. In seeking to answer this, Raab argues that giving an account is akin to 'telling a story' and can be seen to operate at three sequential levels. At its most simple, accountability merely obliges an organisation to report back on its actions. The next level requires mechanisms for that story to be questioned, and for data subjects to offer their own. The third level puts sanctions in place for when an account is poor, either due to inaction or the lack of a proper story being offered in the first place. Whilst this provides some abstract navigational aid, it does not pin down the precise dimensions of a good 'account'.

A series of European projects, including Galway, Paris, and Madrid, have been grappling with the nature of accountability. The Paris project document elaborates elements organisations need to put in place to demonstrate accountability. These largely consist of organisational measures, such as establishing policies based on relevant law, setting up internal bodies to enforce these, providing staff training on information privacy, analysing risks on a regular basis, setting up mechanisms to respond to customer complaints, and providing appropriate redress mechanisms. This sits against the wider work of the Galway project, which states accountability in general requires organisational buy-in, particularly putting in place internal standards that correlate with external requirements; access to resources to support compliance with policies (training etc.); and internal oversight mechanisms to ensure adherence, coupled with approaches for appropriate sanctions and rule enforcement.

31. A29 WP, "Opinion 3/2010 on the Principle of Accountability," paras. 44–45; Alhadeff, Van Alsenoy, and Dumortier, "The Accountability Principle in Data Protection Regulation," 19.
32. Article 29 Working Party (2017), Guidelines on Data Protection Impact Assessment (DPIA) and determining whether processing is "likely to result in a high risk" for the purposes of Regulation 2016/679, WP248, p. 12.
33. Raab, "The Meaning of 'Accountability' in the Information Privacy Context," 21.
34. Galway Project: http://www.informationpolicycentre.com
35. Paris Project: https://iapp.org/media/pdf/knowledge_center/Accountability_Phase_II.pdf
36. Madrid Resolution: http://www.privacyconference2009.org/media/notas_prensa/common/pdfs/061109_estandares_internacionales_en.pdf
37. M. Abrams, "Data Protection Accountability: The Essential Elements – A Document for Discussion," 2009, 4.

Examining guidance offered by the European Data Protection Supervisor and the UK Information Commissioner's Office, we also find a range of new measures in GDPR to assist with accountability requirements. We cluster these in terms of 'technical' or 'organisational' measures:

Technical measures: data protection by design and default, including use of anonymisation, pseudonymisation and end-to-end encryption; IT security risk management.
Organisational measures: assigning data protection officers (DPOs); prior consultations; certification schemes; data protection impact assessments (DPIAs); transparent policies; documentation and record keeping on processing for organisations with over 250 staff; internal compliance and audits of the effectiveness of approaches; training.

GDPR thus puts in place a raft of new organisational and technical 'responsibilities' for controllers. Executing these responsibilities is not, as the EDPS puts it, simply a 'box ticking exercise'. The challenge lies in implementing these organisational and technical measures as a basis for demonstrating compliance. Thus, it is only through the work of doing compliance that accountability comes to life. As Ihde reminds us, "Left on a shelf, the Swiss army knife or the cell phone 'does' nothing." The same can be said for the measures mandated by GDPR. It is only when they are built into everyday practice that complex negotiations between controller and user will emerge, and we can understand what an 'account' may demonstrably look like.

It is equally difficult to succinctly pin down to whom accountability should be demonstrated. The Madrid Resolution attempts to set up international standards on accountability and states that demonstrations should be to supervisory authorities and data subjects. However, GDPR is not framed as narrowly. Whilst data subjects and supervisory authorities are clear stakeholders, Article 5(2) is not limited to them, and it is artificial to read Article 5 in isolation from the rest of GDPR, which places many other responsibilities on data controllers. Article 24 specifically focuses on the nature of their wider responsibilities:

"Taking into account the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for the rights and freedoms of natural persons, the controller shall implement appropriate technical and organisational measures to ensure and to be able to demonstrate that processing is performed in accordance with this Regulation." (Article 24(1), our emphasis)

38. G. Buttarelli, "The accountability principle in the new GDPR," speech at the European Court of Justice, Luxembourg, 30 September 2016.
39. ICO website: https://ico.org.uk/for-organisations/data-protection-reform/overview-of-the-gdpr/accountabilityand-governance/
40. Article 29 Working Party, "Guidelines on Data Protection Impact Assessment (DPIA) and Determining Whether Processing Is 'likely to Result in a High Risk' for the Purposes of Regulation 2016/679" (Brussels, 2017).
41. The nature of what should be recorded about processing is laid out in Article 30 GDPR. See Article 30(5) on the conditions under which organisations smaller than 250 persons also need to keep records, e.g., if they are handling special categories of personal data, information relating to criminal convictions, etc. See also Recitals 13, 39 and 82 for more detail on reporting.
42. G. Buttarelli, "The accountability principle in the new GDPR," speech at the European Court of Justice, Luxembourg, 30 September 2016.
43. Don Ihde, "Smart? Amsterdam Urinals and Autonomic Computing," in Law, Human Agency and Autonomic Computing: Philosophy of Law Meets the Philosophy of Computing, ed. Mireille Hildebrandt and A. Rouvroy (Routledge, 2011).
44. Madrid Resolution: http://www.privacyconference2009.org/media/notas_prensa/common/pdfs/061109_estandares_internacionales_en.pdf
Article 24 surfaces concepts that need to be read in conjunction with Article 5(2) to situate the full extent of data controller responsibilities in GDPR. We focus on the four key issues emphasised above.

Nature, scope, context and purposes of processing, and risks of varying likelihood and severity. As these two elements are linked, we consider them side by side. In determining the 'nature, scope, context and purposes of processing', recital 76 GDPR states 'objective risk assessment' is necessary to establish the level of risk attendant on data processing, e.g., whether it is risk or high risk. Recital 75 provides examples of particular kinds of risk occasioned by data processing, including when it results in discrimination, financial loss, identity theft or fraud, damage to reputation and reversal of pseudonymisation, to name a few. Whilst Article 24 requires assessment of risk in general, it does not call for a Data Protection Impact Assessment (DPIA) in all cases. However, the focus on risk analysis clearly links to Article 35, which requires a DPIA for processing 'likely to result in high risks'. The nature of 'high risk' is explored in depth in newly released A29 WP guidance, which provides nine examples of high risk processing, including processing of data concerning vulnerable data subjects, combining datasets, innovative use of data for new technological/organisational solutions, or preventing data subjects accessing a service. Determining the need for DPIAs, and differentiating the distinctions between risk assessment in Article 24 and Article 35, is a complex exercise. The nine A29 WP examples are quite broad, and many IoT applications will likely require a DPIA. This is not necessarily a bad thing, as DPIAs are an important accountability mechanism providing for 'building and demonstrating compliance'. Nonetheless, it is unclear why Article 24 does not simply state that DPIAs are always necessary, as this seems to be the practical implication of the A29 WP guidance.

Implementation of appropriate technical and organisational measures. The language of 'technical and organisational' measures to demonstrate compliance in Article 24 closely aligns with the Article 25 requirements on 'data protection by design and default' (DPbD). DPbD obliges data controllers to safeguard the freedoms and rights of individuals at the time of the determination of the means for processing and at the time of the processing itself. This may require minimising the processing of personal data, pseudonymising personal data as soon as possible (a sketch of one such technique follows the list below), enabling transparency with regard to the functions and processing of personal data, and allowing the data subject to monitor data processing. In addition, by default, technical and organisational measures should be taken to ensure that:

• Only personal data which are necessary for each specific purpose of processing are processed.
• The amount of personal data collected, the extent of their processing, the period of their storage and their accessibility is controlled.
• Personal data are not made accessible without the individual's intervention to an indefinite number of natural persons.
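As one concrete illustration of the pseudonymisation measure named above, the following minimal Python sketch derives stable pseudonyms with a keyed hash; the key handling is deliberately simplified, and in practice the key would be generated securely and stored apart from the pseudonymised data, since data only counts as pseudonymised while the means of re-identification are kept separate.

```python
import hashlib
import hmac

# Illustrative key only: a real deployment would generate this securely
# and hold it separately from the data store.
PSEUDONYM_KEY = b"replace-with-a-securely-stored-key"

def pseudonymise(identifier: str) -> str:
    """Derive a stable pseudonym (HMAC-SHA256) so records about one
    subject can be linked for processing without exposing the raw
    identifier in the dataset itself."""
    return hmac.new(PSEUDONYM_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

event = {"device": "smart_mirror", "reading": "skin_dryness=0.7"}
event["subject"] = pseudonymise("alice@example.com")  # raw identifier never stored
print(event)
```

Using a keyed hash rather than a plain hash matters here: without the key, an attacker who guesses an identifier cannot simply recompute the pseudonym and reverse the mapping.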
45. pp. 9–10.
46. p. 4.
47. p. 8.
48. Recital 78, GDPR, 2016.

In accordance with Article 24, and taking into account the nature, scope, context, purposes and risks of processing, DPbD shall reflect the 'state of the art'. This includes putting appropriate technological measures in place to demonstrate accountability and achieve compliance. Recital 63 GDPR, for example, states that in regard to data subject rights of access, "... where possible, the controller should be able to provide remote access to a secure system which would provide the data subject with direct access to his or her personal data." We need to acknowledge, then, that GDPR invokes a turn to the systems design community to engage with data protection challenges, though the nature of design's new, explicit role in data protection regulation remains unsettled.

Processing is performed in accordance with this Regulation. This requirement brings us full circle back to Article 5(2), that the controller shall demonstrate compliance with the accountability principle, which in turn pulls on other GDPR provisions. The 'lawful, fair and transparent' principle in Article 5(1), for example, requires a turn to the Chapter II GDPR Articles on lawful processing (Article 6) and consent (Article 7), to name but two. When Article 5(2) is read in the context of GDPR as a whole, we also need to examine the nature of data controller responsibility documented in Article 24. Upon doing this, the breadth of responsibilities implicated by the accountability principle becomes apparent. We believe it requires measures for compliance with, and subsequent demonstration of compliance with, the entire GDPR. It is hard to isolate provisions of GDPR, as they often connect to and explicitly call on other provisions. This is clear with accountability, which starts as a narrow principle and grows hugely in scope as we dig deeper.

Nevertheless, some elements of GDPR align more naturally with the principle. Two examples are transparency (Article 12) and record keeping (Article 30). Accountability turns on the ability to question accounts provided by data controllers around their data handling practices. This requires that record keeping about data processing is in place to demonstrate that compliance with GDPR has been considered and actioned. Similarly, transparency is intrinsically linked to accountability. Transparency mainly focuses on communication, by requiring that processing information be provided in clear, concise language which data subjects (and the public at large) can easily comprehend. As framed in GDPR, transparency is less about providing mechanisms to hold controllers to account; instead it intersects with accountability by dictating the nature of account giving.
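To ground the record-keeping point, the sketch below shows one plausible shape for an Article 30-style record of processing activities; the field names, and the example controller and values, are our own illustration rather than a prescribed schema.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ProcessingRecord:
    """Illustrative record-of-processing entry loosely following the
    Article 30(1) headings; field names are ours, not the Regulation's."""
    controller: str
    purpose: str
    data_categories: List[str]
    data_subject_categories: List[str]
    recipients: List[str]
    third_country_transfers: List[str]
    retention_period: str
    security_measures: List[str]

# A hypothetical register entry for a connected-home heating service.
register: List[ProcessingRecord] = [
    ProcessingRecord(
        controller="Acme Smart Home Ltd (illustrative)",
        purpose="room-level heating optimisation",
        data_categories=["temperature", "occupancy"],
        data_subject_categories=["householders"],
        recipients=["in-home hub only"],
        third_country_transfers=[],
        retention_period="30 days",
        security_measures=["encryption at rest", "pseudonymised subject IDs"],
    )
]
```

A register in this shape gives a supervisory authority, or a questioning data subject, something concrete to interrogate, which is precisely the account-questioning that accountability turns on.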
ACCOUNTABILITY REQUIREMENTS

Translation of the complex provisions of GDPR into more accessible principles is needed if IoT developers are to build accountability into the IoT. We thus propose seven actionable accountability requirements, which seek to address the key challenges occasioned by the IoT (as outlined above). These requirements highlight manifold 'clusters' of GDPR obligations. This clustering is not exhaustive, but given the broad nature of accountability, we think it provides a useful starting point for considering the nature of an account and the substantive elements of GDPR data controllers need to comply with to demonstrate accountability, as outlined below and summarised in Table 1.

Requirement 1: Limiting Initial Data Collection

GDPR retains the classic data protection principles in Article 5(1) of 'purpose limitation', 'data minimisation' and 'storage limitation'. According to GDPR, personal data should only be collected for 'specified, explicit and legitimate purposes' and not processed in a manner incompatible with those initial purposes. Only what is 'adequate, relevant and necessary' for those initial purposes should be processed. Furthermore, the data should not be kept in a manner which identifies subjects (i.e., not anonymised) longer than necessary for these purposes. Strict oversight over what is being collected, why, and how it is managed is necessary.

49. Article 25(1), GDPR, 2016.
50. Art 5(1)(b).
51. Art 5(1)(c).
52. Art 5(1)(e), with the exception of longer storage for archiving in the public interest, scientific or historical research, or statistical purposes.

Requirement 2: Limitations on International Data Transfer

GDPR provides strict requirements on when personal data can be sent outside Europe. Article 44 states data should only be transferred to third countries on the basis of various conditions. Article 45 states transfers can occur to countries deemed to provide adequate protection by the European Commission, including Uruguay, Israel or New Zealand, to name a few. Other grounds mandate that appropriate safeguards be put in place (Article 46), such as use of standard data protection contract clauses, or binding corporate rules that govern data handling in an organisation (Article 47). The Privacy Shield 2016 agreement now covers data transfers to the USA. It requires that companies apply the principles of notice and choice, and accountability for onward transfer. Minimal oversight is provided by the US Department of Commerce. A sketch of a pre-transfer check along these lines follows.

53. REF to current adequacy decisions.
54. It replaces the Safe Harbor Agreement, which was deemed inadequate following the Schrems decision and the Snowden revelations about mass surveillance programmes.
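As promised above, here is a minimal sketch of a pre-transfer check in the spirit of Articles 44–46: transfer only to adequacy-listed countries, or where another safeguard (standard clauses, binding corporate rules, Privacy Shield certification) is on file. The country and safeguard lists are illustrative snapshots, not authoritative adequacy decisions.

```python
# Hedged sketch: partial, illustrative lists only. A real system would
# track the Commission's current adequacy decisions and per-recipient
# safeguards rather than hard-coding them.
ADEQUACY_COUNTRIES = {"NZ", "IL", "UY", "CH"}            # Article 45, partial list
SAFEGUARDS = {"US": "privacy_shield_certification"}       # Article 46, per destination

def transfer_permitted(country_code: str) -> bool:
    """Return True only if an adequacy decision or a recorded safeguard
    covers the destination; otherwise block and leave for human review."""
    if country_code in ADEQUACY_COUNTRIES:
        return True                      # adequacy decision covers the transfer
    if SAFEGUARDS.get(country_code):
        return True                      # an appropriate safeguard is on file
    return False                         # default: no transfer without review

for destination in ("NZ", "US", "IN"):
    print(destination, transfer_permitted(destination))   # NZ True, US True, IN False
```

The point of the sketch is the default: absent a positive legal basis for the transfer, data stays put, which keeps the controller's account of its data flows demonstrable.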
Requirement 3: Responding to the Spectrum of Control Rights

GDPR provides a spectrum of new control rights around data processing, as described above, in Articles 15 to 21. We frame these as rights users can escalate as needed, from access (Article 15) to rectification (Article 16), objection (Article 21), restriction (Article 18), portability (Article 20) and ultimately erasure and the 'right to be forgotten' (Article 17).

Requirement 4: Guaranteeing Greater Transparency Rights

GDPR provides for increased transparency in the relationship between data controller and data subject. Information about processing, particularly concerning data subject rights, is to be provided in:

"... concise, transparent, intelligible and easily accessible form, using clear and plain language, in particular for any information addressed specifically to a child ... the information shall be provided in writing, or by other means, including, where appropriate, by electronic means." (Article 12)

Controllers need to furnish the data subject with their identity, the purpose of and legal basis for processing, the recipients of their data, and so forth (Article 13). They also need to maintain records of processing under their control, including the actors involved, the nature of processing, the type of data collected, the security measures taken, and so on (Article 30). The infamous Article 22 also tackles accountability in algorithms and profiling. It provides a right for data subjects not to be subject to decisions based solely on automated processing where the result has significant legal effects (e.g., refusal of credit) or similar (e.g., prejudice from algorithmic profiling). Measures to protect data subjects should be implemented, at minimum, by providing human oversight over such decisions and enabling subjects to voice concerns and contest outcomes. This assumes that the actions and concomitant reasoning of algorithms can be made accountable, a challenge in itself, particularly for machine learning algorithms deployed in conjunction with IoT devices.

55. REF

Requirement 5: Ensuring Lawfulness of Processing

Consent is the most discussed ground for lawful processing of personal data. As discussed in detail above, GDPR provides various requirements for consent mechanisms (see Articles 4, 7, 8 and 9), which are problematic for the IoT. However, consent is not the only basis for lawful processing. Article 6 includes other grounds, namely the legitimate interests of the data controller, the necessity of processing for the performance of a contract the subject is party to, or for the controller to satisfy a legal obligation they are subject to. Nonetheless, and insofar as the IoT finds its way at scale into consumer goods, consent will remain an important ingredient in ensuring the lawfulness of processing.

Requirement 6: Protecting Data Storage and Security

Numerous security and storage requirements exist in GDPR. Accuracy of data is key, and appropriate security should be provided, particularly against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical or organisational measures (Article 5). This is accompanied by Article 32 requirements to put in place appropriate technical and organisational measures for the general security of processing, drawing on pseudonymisation and encryption, regular security testing, ensuring the resilience of services, and the timely restoration of data after an incident. GDPR also has strict breach notification provisions around the information required and the timeframe for reporting: within 72 hours to authorities (Article 33). For data subjects, what is reported, and when, is more contingent on the severity of the breach (Article 34).

Requirement 7: Articulating and Responding to Processing Responsibilities

GDPR encourages the adoption of mechanisms for data controllers to articulate their responsibilities. Data Protection Impact Assessments have a key role to play in mapping risks, forecasting their likelihood of occurrence, considering appropriate safeguards, implementing these, and making this process of reflection public (Article 35). In highlighting compliance with GDPR, an increased role is envisaged for certification processes, using seals and marks (Articles 42 and 43). Similarly, it is envisaged that new industry codes of conduct will emerge (Articles 40 and 41). In responding to established responsibilities, GDPR guides action by controllers. For organisations of a certain size, an appointed data protection officer will play a key internal oversight and guidance role (Articles 37 and 39). More generally, the turn to technical measures, encapsulated in Article 25 data protection by design and default, is critical.
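Returning to the DPIA's role in mapping risks, likelihood and safeguards (Article 35), the sketch below shows one hypothetical shape for a DPIA risk-register entry; the fields, scales and example risks are our own assumptions rather than any mandated template.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class DpiaRiskEntry:
    """One row of a hypothetical DPIA risk register in the Article 35
    spirit: the risk, its likelihood and severity, and the safeguard
    adopted in response. Field names and scales are illustrative."""
    risk: str
    likelihood: str   # e.g. "low" / "medium" / "high"
    severity: str
    safeguard: str

dpia_register: List[DpiaRiskEntry] = [
    DpiaRiskEntry(
        risk="re-identification of householders from occupancy traces",
        likelihood="medium",
        severity="high",
        safeguard="aggregate occupancy to hourly counts before data leaves the hub",
    ),
    DpiaRiskEntry(
        risk="unexpected device-to-device sharing of health readings",
        likelihood="medium",
        severity="high",
        safeguard="default-deny sharing policy with householder approval",
    ),
]
```

Kept alongside the processing register sketched earlier, entries like these give a controller a concrete, questionable artefact with which to demonstrate that risks were assessed and safeguards actioned.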
Journal: CoRR, volume abs/1801.07168, published 2018.